
    Functional encryption based approaches for practical privacy-preserving machine learning

    Machine learning (ML) is increasingly being used in a wide variety of application domains. However, deploying ML solutions poses a significant challenge because of growing privacy concerns and the requirements imposed by privacy-related regulations. To tackle these concerns in ML-based applications, significant recent research efforts have focused on developing privacy-preserving ML (PPML) approaches by integrating existing anonymization mechanisms or emerging privacy protection approaches, such as differential privacy, secure computation, and other architectural frameworks, into the ML pipeline. While promising, existing secure computation based approaches have significant computational efficiency issues and are therefore not practical. In this dissertation, we address several challenges related to PPML and propose practical secure computation based approaches to solve them. We consider both two-tier cloud-based and three-tier hybrid cloud-edge PPML architectures, and address both emerging deep learning models and federated learning approaches. The proposed approaches enable data to be outsourced, or a locally trained model to be updated, in a privacy-preserving manner by computing over encrypted datasets or encrypted local models. Our secure computation solutions are based on functional encryption (FE) techniques. Evaluation shows that the proposed approaches are efficient and more practical than existing approaches while providing strong privacy guarantees. We also address the trustworthiness of various entities within the proposed PPML infrastructures, including the third-party authority (TPA), which plays a critical role in the proposed FE-based PPML solutions, and the cloud service providers. To ensure that such entities can be trusted, we propose a transparency and accountability framework using blockchain. We show that the proposed transparency framework is effective and guarantees the desired security properties, and experimental evaluation shows that the framework is efficient.
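    The abstract does not spell out the FE constructions used, but the general compute-over-encrypted-data workflow can be illustrated with a minimal sketch of inner-product functional encryption: a data owner encrypts a vector, a TPA-style authority derives a key for a specific weight vector, and an evaluator learns only the inner product. The masking scheme below is a toy for exposition only, not a secure FE scheme, and all names are illustrative assumptions.

```python
# Toy illustration of the inner-product functional-encryption workflow used in
# FE-based PPML (NOT a secure FE scheme; it only shows which party learns what).
import secrets

class ToyIPFEAuthority:
    """Plays the role of the third-party authority (TPA)."""
    def __init__(self, dim):
        self.dim = dim
        # Master secret: one random mask per vector component.
        self.msk = [secrets.randbelow(10**9) for _ in range(dim)]

    def encrypt(self, x):
        """Data-owner side: hide each component behind a mask."""
        assert len(x) == self.dim
        return [xi + mi for xi, mi in zip(x, self.msk)]

    def keygen(self, y):
        """Functional key for weight vector y: removes exactly the masking
        contribution for <., y>, and nothing more."""
        assert len(y) == self.dim
        return (list(y), sum(mi * yi for mi, yi in zip(self.msk, y)))

def evaluate(ciphertext, func_key):
    """Evaluator (e.g., cloud server): learns only the inner product <x, y>."""
    y, mask_correction = func_key
    return sum(ci * yi for ci, yi in zip(ciphertext, y)) - mask_correction

if __name__ == "__main__":
    tpa = ToyIPFEAuthority(dim=3)
    ct = tpa.encrypt([2, 5, 7])      # encrypted data outsourced to the cloud
    sk_y = tpa.keygen([1, 0, 3])     # key for the weighted sum 1*x0 + 0*x1 + 3*x2
    print(evaluate(ct, sk_y))        # -> 23, without revealing x itself
```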

    Liquid Democracy in DPoS Blockchains

    Voting mechanisms play a crucial role in the decentralized governance of blockchain systems. Liquid democracy, also known as delegative voting, allows voters either to vote directly or to delegate their voting power to others, thereby helping to address problems such as low voter turnout. In recent years, liquid democracy has been widely adopted by Delegated-Proof-of-Stake (DPoS) blockchains and implemented successfully on platforms with millions of users. However, little is known about the characteristics and actual effectiveness of liquid democracy in decentralized governance. This paper explores, for the first time, the practical implementation of liquid democracy in DPoS blockchain systems. Using actual data collected from two major DPoS blockchains, EOS and Steem, our study compares and evaluates the participation of different types of users in liquid democracy, and extracts and analyzes the delegation chains and networks formed during the process. We believe that the findings of this paper will contribute to further studies on the design and implementation of liquid democracy and other voting mechanisms in decentralized governance.
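    The paper's actual extraction pipeline is not described in this abstract; as a hypothetical sketch, the core step of turning per-account delegation records into delegation chains can be expressed as following each voter's delegation to its terminal delegate while flagging cycles.

```python
# Hypothetical sketch: resolving liquid-democracy delegation chains.
# `delegations` maps a voter to the account it delegates voting power to;
# accounts absent from the map vote directly.
def resolve_chain(voter, delegations):
    """Follow delegations until reaching a direct voter; detect cycles."""
    chain = [voter]
    seen = {voter}
    while chain[-1] in delegations:
        nxt = delegations[chain[-1]]
        if nxt in seen:              # delegation cycle: no terminal voter
            return chain, None
        chain.append(nxt)
        seen.add(nxt)
    return chain, chain[-1]          # terminal delegate casts the actual vote

if __name__ == "__main__":
    delegations = {"alice": "bob", "bob": "carol", "dave": "alice"}
    print(resolve_chain("dave", delegations))
    # (['dave', 'alice', 'bob', 'carol'], 'carol')
```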

    T3AB: Transparent and Trustworthy Third-party Authority using Blockchain

    Increasingly, information systems rely on computational, storage, and network resources deployed in third-party facilities or supported by service providers. Such an approach further exacerbates cybersecurity concerns, constantly raised by numerous incidents of security and privacy attacks that result in data leakage and identity theft, among others. These incidents have in turn forced the creation of stricter security and privacy regulations and have eroded trust in cyberspace. In particular, security-related services and infrastructures such as Certificate Authorities (CAs), which provide digital certificate services, and Third-Party Authorities (TPAs), which provide cryptographic key services, are critical components for establishing trust in Internet-enabled applications and services. To address such trust issues, various transparency frameworks and approaches have recently been proposed in the literature. In this paper, we propose a Transparent and Trustworthy TPA using Blockchain (T3AB) to provide transparency and accountability for trusted third-party entities, such as honest-but-curious third-party IaaS servers and coordinators in various privacy-preserving machine learning (PPML) approaches. T3AB employs the Ethereum blockchain as the underlying public ledger and includes a novel smart contract that automates accountability with an incentive mechanism, rewarding participants for taking part in auditing and punishing unintentional or malicious behavior. We implement T3AB and show, through experimental evaluation on Rinkeby, the official Ethereum test network, that the framework is efficient. We also formally show the security guarantees provided by T3AB, and analyze the privacy guarantees and trustworthiness it provides.
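    The smart-contract logic itself is not given in this abstract; the following is a rough Python model of an incentive mechanism of the kind described, in which confirmed audit reports slash the misbehaving party's deposit and reward the auditor. Class names, amounts, and the interface are illustrative assumptions, not T3AB's actual contract.

```python
# Rough model of a T3AB-style audit incentive mechanism (illustrative only).
class AccountabilityLedger:
    def __init__(self, audit_reward=5, misbehavior_penalty=50):
        self.deposits = {}                     # participant -> staked deposit
        self.audit_reward = audit_reward
        self.misbehavior_penalty = misbehavior_penalty

    def register(self, participant, deposit):
        self.deposits[participant] = deposit

    def report_misbehavior(self, auditor, accused, evidence_valid):
        """If the audit evidence checks out, slash the accused party's deposit
        and reward the auditor; otherwise nothing changes."""
        if not evidence_valid:
            return
        penalty = min(self.misbehavior_penalty, self.deposits.get(accused, 0))
        self.deposits[accused] = self.deposits.get(accused, 0) - penalty
        self.deposits[auditor] = self.deposits.get(auditor, 0) + self.audit_reward

if __name__ == "__main__":
    ledger = AccountabilityLedger()
    ledger.register("tpa", 100)
    ledger.register("auditor", 10)
    ledger.report_misbehavior("auditor", "tpa", evidence_valid=True)
    print(ledger.deposits)   # {'tpa': 50, 'auditor': 15}
```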

    Reality-Mining with Smartphones: Detecting and Predicting Life Events based on App Installation Behavior

    Life events are often described as major forces that will shape tomorrow's consumer needs, behavior, and mood. The prediction of life events is therefore highly relevant in marketing and sociology. In this paper, we propose a data-driven, real-time method for predicting individual life events using readily available data from smartphones. Our large-scale user study with more than 2,000 users shows that, on average, our method predicts life events with 64.5% higher accuracy, 183.1% better precision, and 88.0% higher specificity than a random model.
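    Assuming the "X% higher than a random model" figures follow the usual definition of relative improvement over a baseline metric, the arithmetic is simply the following; the numeric values in the example are illustrative, not taken from the paper.

```python
# Assumed definition of the reported "X% higher than a random model" figures:
# relative improvement of the model's metric over the random baseline's metric.
def relative_improvement(model_metric, random_metric):
    return 100.0 * (model_metric - random_metric) / random_metric

# e.g., a model accuracy of 0.658 vs. a random baseline of 0.400 would be
# reported as ~64.5% higher accuracy (illustrative numbers).
print(round(relative_improvement(0.658, 0.400), 1))   # 64.5
```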

    Scalable and Privacy-preserving Design of On/Off-chain Smart Contracts

    The rise of smart contract systems such as Ethereum has resulted in a proliferation of blockchain-based decentralized applications, including applications that store and manage a wide range of data. Current smart contracts are designed to be executed solely by miners and are revealed entirely on-chain, which limits scalability and privacy. In this paper, we show that the scalability and privacy of smart contracts can be enhanced by splitting a given contract into an off-chain contract and an on-chain contract. Specifically, functions of the contract that involve high-cost computation or sensitive information can be split off into the off-chain contract, which is signed and executed only by the interested participants. The proposed approach allows the participants to reach unanimous agreement off-chain when all of them are honest, saving the computing resources of miners and keeping the content of the off-chain contract hidden from the public. In case of a dispute caused by dishonest participants, a signed copy of the off-chain contract can be revealed so that a verified instance can be created and miners can enforce the true execution result. Thus, honest participants are able to redress and penalize any fraudulent or dishonest behavior, which incentivizes all participants to honestly follow the agreed off-chain contract. We discuss techniques for splitting a contract into a pair of on/off-chain contracts and propose a mechanism to handle dishonest participants in the system. Our implementation and evaluation of the proposed approach on an example smart contract demonstrate its effectiveness in Ethereum.
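    The dispute-handling flow can be sketched as follows, under simplifying assumptions: real signatures are replaced by a set of acknowledgements, and the on-chain verifier is modeled as a plain Python function checking a stored commitment. This is an illustration of the workflow described in the abstract, not the paper's actual protocol code.

```python
import hashlib, json

# Simplified sketch of the on/off-chain split: participants agree off-chain on
# a result; on dispute, the signed copy is revealed so the on-chain side can
# verify it against a prior commitment and enforce the result.
def digest(contract):
    return hashlib.sha256(json.dumps(contract, sort_keys=True).encode()).hexdigest()

def off_chain_agree(contract, participants):
    """All participants 'sign' the off-chain contract and its execution result."""
    return {"contract": contract, "signatures": set(participants)}

def on_chain_enforce(agreement, expected_participants, committed_digest):
    """On dispute, verify the revealed copy against the on-chain commitment and
    the full signature set before enforcing the result."""
    ok = (agreement["signatures"] == set(expected_participants)
          and digest(agreement["contract"]) == committed_digest)
    return agreement["contract"]["result"] if ok else None

if __name__ == "__main__":
    contract = {"parties": ["alice", "bob"], "payload": "split-computation", "result": 42}
    commitment = digest(contract)                    # stored in the on-chain contract
    agreement = off_chain_agree(contract, ["alice", "bob"])
    print(on_chain_enforce(agreement, ["alice", "bob"], commitment))   # 42
```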

    Cross-Consensus Measurement of Individual-level Decentralization in Blockchains

    Decentralization is widely recognized as a crucial characteristic of blockchains that enables them to resist malicious attacks such as the 51% attack and the takeover attack. Prior research has primarily examined decentralization in blockchains employing the same consensus protocol, or at the level of block producers. This paper presents the first individual-level measurement study comparing the decentralization of blockchains employing different consensus protocols. To facilitate cross-consensus evaluation, we present a two-level comparison framework and a new metric. We apply the proposed methods to Ethereum and Steem, two representative blockchains whose decentralization has garnered considerable interest. Our findings offer a deeper view of the level of decentralization, suggest the existence of centralization risk at the individual level in Steem, and provide novel insights into the cross-consensus comparison of decentralization in blockchains.
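    The paper's own cross-consensus metric is not described in this abstract; as a generic stand-in, individual-level decentralization is often proxied by the Shannon entropy of per-individual shares of block production or stake, which the following sketch computes with illustrative numbers.

```python
import math

# Generic illustration (not the paper's metric): Shannon entropy of
# per-individual shares of block production or stake.
# Higher entropy = more decentralized.
def entropy_of_shares(counts):
    total = sum(counts)
    shares = [c / total for c in counts if c > 0]
    return -sum(s * math.log2(s) for s in shares)

if __name__ == "__main__":
    # A highly concentrated distribution across 4 individuals...
    print(round(entropy_of_shares([900, 50, 30, 20]), 3))
    # ...versus a perfectly even split across 4 individuals -> 2.0 bits.
    print(entropy_of_shares([25, 25, 25, 25]))
```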

    HybridAlpha: An Efficient Approach for Privacy-Preserving Federated Learning

    Federated learning has emerged as a promising approach for collaborative and privacy-preserving learning. Participants in a federated learning process cooperatively train a model by exchanging model parameters instead of the actual training data, which they might want to keep private. However, the parameter exchange and the resulting model may still disclose information about the training data. To address these privacy concerns, several approaches based on differential privacy and secure multiparty computation (SMC), among others, have been proposed; they often incur large communication overhead and slow training times. In this paper, we propose HybridAlpha, an approach for privacy-preserving federated learning employing an SMC protocol based on functional encryption. The protocol is simple, efficient, and resilient to participants dropping out. We evaluate our approach in terms of training time and data volume exchanged, using a federated learning process to train a CNN on the MNIST data set. Evaluation against existing crypto-based SMC solutions shows that HybridAlpha can reduce the training time by 68% and the data transfer volume by 92% on average, while providing the same model performance and privacy guarantees as the existing solutions.
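    The key property of the FE-based protocol is that the aggregator learns only the combined update of the participants that actually submitted, which is also what makes dropouts tolerable. The masking-based sketch below illustrates that workflow only; it is not a secure scheme and not HybridAlpha's actual construction, and all names are illustrative assumptions.

```python
import secrets

# Toy, insecure sketch of FE-style secure aggregation for federated learning:
# each client masks its model update, and the TPA issues an aggregation key for
# exactly the clients that submitted (dropouts are simply excluded).
class ToyAggregationTPA:
    def __init__(self, client_ids, dim):
        self.dim = dim
        self.masks = {cid: [secrets.randbelow(10**9) for _ in range(dim)]
                      for cid in client_ids}

    def encrypt_update(self, cid, update):
        return [u + m for u, m in zip(update, self.masks[cid])]

    def aggregation_key(self, submitted_ids):
        """Total mask contributed by exactly the submitting clients."""
        return [sum(self.masks[cid][j] for cid in submitted_ids)
                for j in range(self.dim)]

def aggregate(ciphertexts, agg_key):
    """Aggregator learns only the coordinate-wise sum of the submitted updates."""
    sums = [sum(cs) for cs in zip(*ciphertexts)]
    return [s - k for s, k in zip(sums, agg_key)]

if __name__ == "__main__":
    tpa = ToyAggregationTPA(["c1", "c2", "c3"], dim=2)
    cts = [tpa.encrypt_update("c1", [1.0, 2.0]),
           tpa.encrypt_update("c2", [3.0, 4.0])]      # c3 dropped out
    key = tpa.aggregation_key(["c1", "c2"])
    print(aggregate(cts, key))                        # [4.0, 6.0]
```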